
Collaborating Authors

Bob Coecke


Efficient Generation of Parameterised Quantum Circuits from Large Texts

Krawchuk, Colin, Khatri, Nikhil, Ortega, Neil John, Kartsaklis, Dimitri

arXiv.org Artificial Intelligence

Quantum approaches to natural language processing (NLP) are redefining how linguistic information is represented and processed. While traditional hybrid quantum-classical models rely heavily on classical neural networks, recent advances propose a novel framework, DisCoCirc, capable of directly encoding entire documents as parameterised quantum circuits (PQCs), in addition to offering interpretability and compositionality benefits. Following these ideas, this paper introduces an efficient methodology for converting large-scale texts into quantum circuits using tree-like representations of pregroup diagrams. Exploiting the compositional parallels between language and quantum mechanics, grounded in symmetric monoidal categories, our approach enables the faithful and efficient encoding of syntactic and discourse relationships in long and complex texts (up to 6,410 words in our experiments) into quantum circuits. The developed system is provided to the community as part of the augmented open-source quantum NLP package lambeq Gen II.
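The idea of a text as a parameterised quantum circuit can be made concrete at toy scale: a PQC is a sequence of gates whose rotation angles are trainable parameters, and a DisCoCirc-style encoding contributes one gate block per word, composed in reading order. The sketch below is a minimal, hypothetical illustration in plain Python; the word list, parameter values, and single-qubit gate choice are all made up for illustration, and lambeq's actual ansätze are far richer:

```python
import math

def ry(theta):
    """Single-qubit Y-rotation, the usual parameterised gate, as a 2x2 matrix."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -s], [s, c]]

def apply(gate, state):
    """Apply a 2x2 gate to a 2-amplitude state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Toy "text circuit": each word contributes one parameterised gate, and
# reading the text left to right composes the gates in order.
params = {"alice": 0.7, "runs": 1.3}  # hypothetical trainable word parameters
state = [1.0, 0.0]                    # |0>
for word in ["alice", "runs"]:
    state = apply(ry(params[word]), state)

norm = sum(a * a for a in state)
print(round(norm, 6))  # rotations are unitary, so the norm stays 1.0
```

Training would adjust the entries of `params` so that measurement statistics of the final state solve a downstream task; the point here is only that the text itself fixes the circuit's shape.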


Towards Compositional Interpretability for XAI

Tull, Sean, Lorenz, Robin, Clark, Stephen, Khan, Ilyas, Coecke, Bob

arXiv.org Artificial Intelligence

Artificial intelligence (AI) is currently based largely on black-box machine learning models which lack interpretability. The field of eXplainable AI (XAI) strives to address this major concern, which is critical in high-stakes areas such as the finance, legal and health sectors. We present an approach to defining AI models and their interpretability based on category theory. For this we employ the notion of a compositional model, which sees a model in terms of formal string diagrams which capture its abstract structure together with its concrete implementation. This comprehensive view incorporates deterministic, probabilistic and quantum models. We compare a wide range of AI models as compositional models, including linear and rule-based models, (recurrent) neural networks, transformers, VAEs, and causal and DisCoCirc models. Next we give a definition of interpretation of a model in terms of its compositional structure, demonstrating how to analyse the interpretability of a model, and using this to clarify common themes in XAI. We find that what makes the standard 'intrinsically interpretable' models so transparent is brought out most clearly diagrammatically. This leads us to the more general notion of compositionally-interpretable (CI) models, which additionally include, for instance, causal, conceptual space, and DisCoCirc models. We next demonstrate the explainability benefits of CI models. Firstly, their compositional structure may allow the computation of other quantities of interest, and may facilitate inference from the model to the modelled phenomenon by matching its structure. Secondly, they allow for diagrammatic explanations of their behaviour, based on influence constraints, diagram surgery and rewrite explanations. Finally, we discuss many future directions for the approach, raising the question of how to learn such meaningfully structured models in practice.


Physicist Bob Coecke: 'It's easier to convince kids than adults about quantum mechanics'

The Guardian

Belgian physicist and musician Prof Bob Coecke, 55, wants to teach quantum physics to a mass audience. The paradox-filled theory that describes the microscopic realm has become a staple of science fiction, from Marvel's Ant-Man to the multiple Oscar-winning Everything Everywhere All at Once. It's famously bizarre and, in the UK, the subject is mostly reserved for undergraduates specialising in physics because it requires grappling with complicated maths. But Coecke, a former Oxford professor, has devised a maths-free framework using diagrams for total beginners, outlined in Quantum in Pictures, his book with Dr Stefano Gogioso that was published earlier this year. Over the summer, they ran an education experiment, teaching the pictorial method to UK schoolchildren – who then beat the average exam scores of Oxford University's postgraduate physics students.


Towards Transparency in Coreference Resolution: A Quantum-Inspired Approach

Wazni, Hadi, Sadrzadeh, Mehrnoosh

arXiv.org Artificial Intelligence

Guided by grammatical structure, words compose to form sentences, and guided by discourse structure, sentences compose to form dialogues and documents. The compositional aspect of sentence and discourse units is often overlooked by machine learning algorithms. A recent initiative called Quantum Natural Language Processing (QNLP) learns word meanings as points in a Hilbert space and acts on them via a translation of grammatical structure into Parametrised Quantum Circuits (PQCs). Previous work extended the QNLP translation to discourse structure using points in a closure of Hilbert spaces. In this paper, we evaluate this translation on a Winograd-style pronoun resolution task. We train a Variational Quantum Classifier (VQC) for binary classification and implement an end-to-end pronoun resolution system. The simulations, executed on IBMQ software, converged with an F1 score of 87.20%. The model outperformed two out of three classical coreference resolution systems and approached the state-of-the-art SpanBERT. A mixed quantum-classical model further improved these results, with an F1 score increase of around 6%.
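For intuition, here is a variational quantum classifier at its absolute smallest: a single RY rotation whose ⟨Z⟩ expectation is cos(θ), trained by finite-difference gradient descent on a squared loss. This is a hypothetical one-parameter toy with a made-up dataset, not the paper's VQC or its Winograd task:

```python
import math

def expectation_z(theta):
    """<Z> after RY(theta)|0>; analytically this is cos(theta)."""
    return math.cos(theta)

def predict(x, theta):
    """Angle-encode a scalar input and classify by the sign of <Z>."""
    return 1 if expectation_z(x + theta) >= 0 else 0

# Hypothetical training set: small angles are class 1, large angles class 0.
data = [(0.2, 1), (0.4, 1), (2.8, 0), (3.0, 0)]

def loss(theta):
    # Squared distance of <Z> from the +1/-1 class targets.
    return sum((expectation_z(x + theta) - (1 if y else -1)) ** 2
               for x, y in data)

theta, lr, eps = 0.0, 0.1, 1e-4
for _ in range(200):
    grad = (loss(theta + eps) - loss(theta - eps)) / (2 * eps)
    theta -= lr * grad

accuracy = sum(predict(x, theta) == y for x, y in data) / len(data)
print(accuracy)
```

A real VQC replaces `expectation_z` with circuit execution on a simulator or hardware, and the single angle with many entangling layers; the training loop is otherwise the same shape.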


Book review: 'Quantum in Pictures'

Oxford Comp Sci

The latest work by computer scientists Bob Coecke and Stefano Gogioso, 'Quantum in Pictures', aims to make the quantum world more accessible and inclusive. So, whether you're a high school student or a science enthusiast, the authors are confident that anyone mastering the tools in the book will gain an understanding equivalent to that of a quantum mechanics graduate at university. But could a complete novice in quantum computing, i.e., this reviewer, gain a genuine understanding of the field by simply reading this book? Let's test this out, shall we? Full disclosure from the get-go: I have absolutely no prior knowledge or expertise in quantum computing, so Coecke and Gogioso's latest book is not only worthy of a review but also a lesson for someone who barely scraped a C in GCSE Maths – a learning curve, if you will. For context, 'Quantum in Pictures' is the brainchild of Quantinuum's chief scientist Professor Bob Coecke and Dr Stefano Gogioso of Oxford University. The book introduces a formalism for quantum mechanics based on the 'ZX-calculus' (or 'ZX'), used to describe quantum processes.


Differentiating and Integrating ZX Diagrams with Applications to Quantum Machine Learning

Wang, Quanlong, Yeung, Richie, Koch, Mark

arXiv.org Artificial Intelligence

ZX-calculus has proved to be a useful tool for quantum technology, with a wide range of successful applications. Most of these applications are algebraic in nature; tasks that involve differentiation and integration, however, remain out of reach for current ZX techniques. Here we elevate ZX to an analytical perspective by realising differentiation and integration entirely within the framework of ZX-calculus. We explicitly illustrate this new analytic framework by applying it in the context of quantum machine learning for the analysis of barren plateaus.
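The quantity being differentiated here is the kind of expectation value a PQC produces. Numerically, the same derivative is what the standard parameter-shift rule computes for rotation gates; the paper's contribution is doing this diagrammatically inside ZX. A minimal sanity check, using the fact that ⟨Z⟩ after RY(θ)|0⟩ is cos(θ):

```python
import math

def f(theta):
    """Archetypal PQC cost function: <Z> of RY(theta)|0>, i.e. cos(theta)."""
    return math.cos(theta)

def parameter_shift(theta):
    """Exact gradient of f via the parameter-shift rule for rotation gates."""
    return (f(theta + math.pi / 2) - f(theta - math.pi / 2)) / 2

# The two shifted evaluations reproduce the analytic derivative -sin(theta).
for theta in (0.0, 0.8, 2.5):
    assert abs(parameter_shift(theta) + math.sin(theta)) < 1e-12
print("parameter-shift gradient matches -sin(theta)")
```

Barren-plateau analysis asks how such gradients behave (typically, how their variance decays) as circuits grow; the rule above is the exact building block those analyses differentiate.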


Quantum Natural Language Generation on Near-Term Devices

Karamlou, Amin, Pfaffhauser, Marcel, Wootton, James

arXiv.org Artificial Intelligence

The emergence of noisy medium-scale quantum devices has led to proof-of-concept applications for quantum computing in various domains. Examples include Natural Language Processing (NLP) where sentence classification experiments have been carried out, as well as procedural generation, where tasks such as geopolitical map creation, and image manipulation have been performed. We explore applications at the intersection of these two areas by designing a hybrid quantum-classical algorithm for sentence generation. Our algorithm is based on the well-known simulated annealing technique for combinatorial optimisation. An implementation is provided and used to demonstrate successful sentence generation on both simulated and real quantum hardware. A variant of our algorithm can also be used for music generation. This paper aims to be self-contained, introducing all the necessary background on NLP and quantum computing along the way.
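The simulated-annealing core of such an algorithm is easy to sketch. The toy below anneals word choices against a made-up scoring function; the vocabulary and objective are illustrative stand-ins, not the paper's quantum-evaluated scorer:

```python
import math
import random

random.seed(0)

VOCAB = ["the", "cat", "dog", "sat", "ran", "here"]

def score(sentence):
    """Toy objective rewarding a determiner-noun-verb shape (made-up scoring)."""
    s = 0
    if sentence[0] == "the":
        s += 1
    if sentence[1] in ("cat", "dog"):
        s += 1
    if sentence[2] in ("sat", "ran"):
        s += 1
    return s

def anneal(length=3, steps=2000, t0=2.0):
    current = [random.choice(VOCAB) for _ in range(length)]
    for step in range(steps):
        temp = t0 * (1 - step / steps) + 1e-9  # linear cooling schedule
        candidate = current[:]
        candidate[random.randrange(length)] = random.choice(VOCAB)
        delta = score(candidate) - score(current)
        # Accept improvements always; worse moves with Boltzmann probability.
        if delta >= 0 or random.random() < math.exp(delta / temp):
            current = candidate
    return current

best = anneal()
print(best, score(best))
```

In the hybrid setting, `score` would be evaluated with a quantum subroutine while the annealing loop stays classical, which is what makes the scheme suitable for near-term devices.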


Lambek pregroups are Frobenius spiders in preorders

Pavlovic, Dusko

arXiv.org Artificial Intelligence

"Spider" is a nickname of special Frobenius algebras, a fundamental structure from mathematics, physics, and computer science. Pregroups are a fundamental structure from linguistics. Pregroups and spiders have been used together in natural language processing: one for syntax, the other for semantics. It turns out that pregroups themselves can be characterized as pointed spiders in the category of preordered relations, where they naturally arise from grammars. The other way around, preordered spider algebras in general can be characterized as unions of pregroups. This extends the characterization of relational spider algebras as disjoint unions of groups. The compositional framework that emerged with the results suggests new ways to understand and apply the basis structures in machine learning and data analysis.


Representing Matrices Using Algebraic ZX-calculus

Wang, Quanlong

arXiv.org Artificial Intelligence

Matrices are used everywhere in modern science, from machine learning [11] to quantum computing [12], to name just two fields. Meanwhile, there is a graphical language called ZX-calculus that can also handle matrix calculations such as matrix multiplication and the tensor product [2, 3]. A question then naturally arises: why bother with diagrams for matrix calculations, given that matrix technology has been applied with great success? There are a few reasons. First, there is a lot of redundancy in matrix calculations which can be avoided in a graphical calculus. For example, to prove the cyclic property of the trace, tr(AB) = tr(BA), all the elements of the two matrices are involved, while in a graphical language like ZX-calculus the proof of the cyclic property is almost a tautology [4]. Second, matrix calculations always involve all the elements of the matrices, and are thus "global" operations, while in ZX-calculus the operations are just diagram rewriting, where only a part of a diagram is replaced by another sub-diagram according to a certain rewriting rule, thus essentially a "local" operation, which makes things much easier. Finally, graphical calculus is much more intuitive than matrix calculation, so a pattern or structure is more likely to be recognised in a graphical formalism. In fact, as a graphical calculus for matrix calculation, ZX-calculus has achieved plenty of successes in the field of quantum computing and information [1, 5, 7, 13]. For research realms beyond quantum, however, traditional ZX-calculus [2] is inconvenient, as "it lacks a way to directly encode the complex numbers" [18].
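The cyclic property mentioned above is easy to confirm numerically, element by element, which is precisely the "global" computation that the ZX proof replaces with an almost tautological diagram deformation:

```python
def matmul(A, B):
    """Plain dense matrix product, touching every element of both matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2], [3, 4]]
B = [[0, 5], [6, 7]]
print(trace(matmul(A, B)), trace(matmul(B, A)))  # 55 55
```

Diagrammatically, tr(AB) and tr(BA) are the same closed loop read from two starting points, so no element-wise computation is needed at all.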


lambeq: An Efficient High-Level Python Library for Quantum NLP

Kartsaklis, Dimitri, Fan, Ian, Yeung, Richie, Pearson, Anna, Lorenz, Robin, Toumi, Alexis, de Felice, Giovanni, Meichanetzidis, Konstantinos, Clark, Stephen, Coecke, Bob

arXiv.org Artificial Intelligence

We present lambeq, the first high-level Python library for Quantum Natural Language Processing (QNLP). The open-source toolkit offers a detailed hierarchy of modules and classes implementing all stages of a pipeline for converting sentences to string diagrams, tensor networks, and quantum circuits ready to be used on a quantum computer. lambeq supports syntactic parsing, rewriting and simplification of string diagrams, ansatz creation and manipulation, as well as a number of compositional models for preparing quantum-friendly representations of sentences, employing various degrees of syntax sensitivity. We present the generic architecture and describe the most important modules in detail, demonstrating the usage with illustrative examples. Further, we test the toolkit in practice by using it to perform a number of experiments on simple NLP tasks, implementing both classical and quantum pipelines.